Penalized Regression, Standard Errors, and Bayesian Lassos
Authors
Abstract
Penalized regression methods for simultaneous variable selection and coefficient estimation, especially those based on the lasso of Tibshirani (1996), have received a great deal of attention in recent years, mostly through frequentist models. Properties such as consistency have been studied, and are achieved by different lasso variations. Here we look at a fully Bayesian formulation of the problem, which is flexible enough to encompass most versions of the lasso that have been previously considered. The advantages of the hierarchical Bayesian formulations are many: in addition to the usual ease of interpretation of hierarchical models, the Bayesian formulation produces valid standard errors (which can be problematic for the frequentist lasso) and is based on a geometrically ergodic Markov chain. We compare the performance of the Bayesian lassos to their frequentist counterparts using simulations, data sets that previous lasso papers have used, and a difficult modeling problem for predicting the collapse of governments around the world. In terms of prediction mean squared error, the Bayesian lasso performance is similar to, and in some cases better than, that of the frequentist lasso.
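The hierarchical formulation mentioned in the abstract typically represents the Laplace (lasso) prior on each coefficient as a scale mixture of normals, which yields closed-form full conditionals and hence a simple Gibbs sampler; the posterior draws then give both point estimates and the valid standard errors the abstract highlights. The following NumPy sketch of this standard sampler is illustrative only — the function name, the fixed penalty `lam`, and all tuning constants are assumptions for the example, not the paper's own code:

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, burn=500, seed=0):
    """Gibbs sampler for a Bayesian lasso via the normal scale-mixture
    representation of the Laplace prior.  `lam` is held fixed here;
    in practice it is often given a hyperprior or tuned empirically."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta = np.zeros(p)
    sigma2 = 1.0
    inv_tau2 = np.ones(p)          # 1/tau_j^2, the latent local scales
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), A = X'X + D_tau^{-1}
        A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma2 | rest ~ Inverse-Gamma(shape, rate)
        resid = y - X @ beta
        shape = (n - 1) / 2 + p / 2
        rate = resid @ resid / 2 + beta @ (inv_tau2 * beta) / 2
        sigma2 = rate / rng.gamma(shape)
        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        if it >= burn:
            draws.append(beta)
    return np.array(draws)
```

Given the retained draws, `draws.mean(axis=0)` gives the posterior-mean coefficient estimates and `draws.std(axis=0)` gives posterior standard deviations, which serve as the standard errors that are difficult to obtain for the frequentist lasso.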
Similar resources
Comparison of Maximum Likelihood Estimation and Bayesian with Generalized Gibbs Sampling for Ordinal Regression Analysis of Ovarian Hyperstimulation Syndrome
Background and Objectives: Analysis of ordinal outcome data can yield biased estimates and large variances when the data are sparse. The objective of this study is to compare parameter estimates of an ordinal regression model under the maximum likelihood and Bayesian frameworks with generalized Gibbs sampling. The models were used to analyze ovarian hyperstimulation syndrome data. Methods: This study use...
Spatially Adaptive Bayesian Penalized Splines With Heteroscedastic Errors
Penalized splines have become an increasingly popular tool for nonparametric smoothing because of their use of low-rank spline bases, which keeps computation tractable while achieving accuracy comparable to smoothing splines. This article extends penalized spline methodology by both modeling the variance function nonparametrically and using a spatially adaptive smoothing parameter. This combina...
Penalized Bregman Divergence Estimation via Coordinate Descent
Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. More recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear and penalized logistic regression, and was shown to be computationally superior. This paper explores...
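To make the coordinate descent idea above concrete: each pass cyclically minimizes the penalized least-squares objective over one coefficient at a time, and for the lasso penalty each univariate subproblem has a closed-form soft-thresholding solution. The sketch below uses my own naming and the common 1/(2n) scaling of the loss; it is an illustration of the general technique, not the implementation from any of the cited papers:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso:
    minimize (1/2n) * ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X**2).sum(axis=0) / n    # x_j'x_j / n for each column
    resid = y.copy()                   # running residual y - X beta
    for _ in range(n_iter):
        for j in range(p):
            # partial correlation with coordinate j added back in
            rho = X[:, j] @ resid / n + col_sq[j] * beta[j]
            # univariate soft-thresholding update
            new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            resid += X[:, j] * (beta[j] - new)
            beta[j] = new
    return beta
```

With `lam` at or above `max_j |x_j' y| / n` the update thresholds every coefficient to zero, and as `lam` shrinks toward zero the solution approaches ordinary least squares — the two endpoints of the lasso path.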
Bayesian Analysis for Penalized Spline Regression Using WinBUGS
Penalized splines can be viewed as BLUPs in a mixed model framework, which allows the use of mixed model software for smoothing. Thus, software originally developed for Bayesian analysis of mixed models can be used for penalized spline regression. Bayesian inference for nonparametric models enjoys the flexibility of nonparametric models and the exact inference provided by the Bayesian inferenti...
Bayesian perspectives for epidemiological research. II. Regression analysis.
This article describes extensions of the basic Bayesian methods using data priors to regression modelling, including hierarchical (multilevel) models. These methods provide an alternative to the parsimony-oriented approach of frequentist regression analysis. In particular, they replace arbitrary variable-selection criteria by prior distributions, and by doing so facilitate realistic use of impr...